# Efficient MoE Architecture
## dots.llm1.inst (rednote-hilab)

License: MIT · Tags: Large Language Model, Transformers, Supports Multiple Languages

dots.llm1 is a large-scale MoE model that activates 14 billion parameters out of a total of 142 billion, with performance comparable to state-of-the-art models.
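Since the card lists Transformers support, the sketch below shows one way a chat-style prompt could be run against this model. The repo id `rednote-hilab/dots.llm1.inst`, the `trust_remote_code` flag, and the hardware assumption (enough GPU memory for the full 142B-parameter checkpoint, even though only ~14B parameters are active per token) are inferred from the card or assumed, not verified usage instructions.

```python
# Minimal sketch: chat-style generation with Hugging Face transformers.
# The repo id and trust_remote_code requirement are assumptions from the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rednote-hilab/dots.llm1.inst"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # only ~14B of 142B params are active per token,
    device_map="auto",            # but all expert weights must still fit in memory
    trust_remote_code=True,
)

messages = [{"role": "user", "content": "Summarize what a Mixture-of-Experts model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```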
## Llama 4 Maverick 17B 128E Instruct (Meta Llama / Undi95)

License: Other · Tags: Multimodal Fusion, Transformers, Supports Multiple Languages

Llama 4 Maverick is a multimodal AI model released by Meta that supports text and image understanding. It adopts a Mixture of Experts (MoE) architecture and excels at multilingual text and code generation tasks.
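Because the card tags both Transformers and multimodal fusion, here is a hedged sketch of combined image-and-text inference using the generic `image-text-to-text` pipeline from a recent transformers release. The repo id `meta-llama/Llama-4-Maverick-17B-128E-Instruct`, the example image URL, and the deployment assumptions (the gated MoE checkpoint needs multi-GPU or heavily quantized serving) are illustrative assumptions, not confirmed by the listing.

```python
# Sketch only: image + text chat inference via the transformers
# "image-text-to-text" pipeline (available in recent releases).
# The repo id and image URL below are assumptions for illustration.
import torch
from transformers import pipeline

pipe = pipeline(
    "image-text-to-text",
    model="meta-llama/Llama-4-Maverick-17B-128E-Instruct",  # assumed Hub repo id
    device_map="auto",
    torch_dtype=torch.bfloat16,
)

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/chart.png"},  # hypothetical image
            {"type": "text", "text": "Describe this image in one sentence."},
        ],
    }
]

# The pipeline returns the conversation extended with the model's reply.
result = pipe(text=messages, max_new_tokens=64)
print(result)
```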